A half century ago, psychologist George Miller (1969) memorably exhorted his colleagues to “give psychology away.” He proposed that a cadre of “psychological technologists” educate the general public in the “new conception of man that is emerging from our science” (p. 1070) and work directly to solve the practical problems encountered “in schools, hospitals, prisons, industries” (p. 1074). Of course, long before Miller's impassioned plea, many psychologists were giving public lectures, writing popular books, and consulting far from the ivory tower. The results were arguably mixed. John Watson (1928), father of behaviorism, warned parents not to show any affection toward their children beyond a handshake, and several years earlier a group of patriotic psychologists developed intelligence tests for World War I that were poorly administered and probably of no use to the Army (Samelson, 1977). Even before that, Harvard's Hugo Münsterberg made so many embarrassing public comments trading on his psychological expertise that the Harvard administration brainstormed about how to keep him quiet (Benjamin, 2006). All that is just one side of the ledger, but it does argue for some humility, and maybe even skepticism. The appeal of public interest—and sometimes a bit of fame—is strong for many psychologists and can tempt even the most serious scientists to overstep the boundaries of their expertise and claim to be able to change the world.

Jesse Singal's new book The Quick Fix takes on several popularized ideas in psychology from the past few decades, including the self-esteem movement, the identification of “implicit biases,” positive psychology interventions, “power pose” effects on performance, and the cultivation of “grit.” Singal is a journalist, not a psychologist, and he turns a critical eye to the offerings that psychologists have given away (or, just as often, sold at some expense) to the public.
In each case, Singal shows that the supposedly supportive research was either equivocal, poorly performed, misinterpreted, or nonexistent but applied at a large scale anyway. For instance, empirical research found that self-esteem sometimes correlated with positive life outcomes, but at other times it did not, and the relationship could never be shown to be causal. However, a California state task force wrote a one-sided summary of that research, and soon schools across the country were teaching lessons in self-esteem. Similarly, a positive psychology intervention had only equivocal effects on depression symptoms in middle school students. However, the program was adopted (in a modified version) by the U.S. Army, to prevent PTSD in soldiers fighting in Iraq and Afghanistan, without any evidence that it would work. The version of the implicit association test used to detect purported preferences for White over Black people will yield widely varying results depending on the day that someone takes the test (it has a test–retest reliability of 0.50 or less). Even so, it is now used in countless programs in universities and corporations to try to show people that they are (implicitly) biased.

Singal places his narratives in the context of the “replication crisis” that started in about 2010 and identified a number of “questionable research practices” (QRPs), such as changing hypotheses after knowing the results of a study, failing to report all of the outcomes or conditions in a study, and collecting additional data until a desired statistical result is achieved (see John, Loewenstein, & Prelec, 2012). For many psychologists, these practices had been taken for granted and were learned either explicitly or by example from mentors. They led to publications, job security, and even prestige.
Often, before the replication crisis, a failure to obtain data supporting expected conclusions was taken to indicate incompetence in data collection, and a failure to obtain statistically significant results was seen as laziness (not looking hard enough). Of course, this approach led to massive bias in which results were published, and eventually the bubble burst. A number of events precipitated the crisis, including the publication of an article (Bem, 2011) finding support for extrasensory perception. That the article had found support for physically impossible phenomena by using standard methodological and statistical practices cast serious doubt on those practices.

Singal discusses, in nontechnical language, the QRPs and other methodological problems that underlie many of the “fad psychology” concepts he discusses. He does not always phrase things as a psychologist would, but by and large his descriptions are impressively accurate while staying accessible to the general reader. For instance, he describes how the original research suggesting that a certain posture (the power pose) made people take bolder risks, feel more powerful, and even secrete more testosterone was based on “p-hacked” results where data were analyzed in different ways until statistically significant analyses were obtained. Similarly, the research claiming that psychological “grit” was a uniquely powerful predictor of performance relied on samples where variability in other predictors was artificially restricted or where the performance outcomes themselves showed very little variability.

It is hard to discuss QRPs without sounding a bit moralistic, and although Singal's tone is generally neutral and dispassionate, his presentation of the evidence against certain research programs can be damning. Although at times his descriptions of individual researchers seem harsh, I cannot call those descriptions unfair.
Singal praises researchers who are willing to acknowledge limitations of their earlier work, and he notes the continuing controversies over some issues. He understands that the standards for psychological science have changed over time, and he expects that researchers will be open and transparent about any deficiencies in their past work. He seems to reserve his harshest judgments for researchers who dig their heels in or, even worse, those who write peer-reviewed journal articles full of circumspect claims and cautious conclusions while offering popular books and TED talks containing bold and reckless assertions supposedly based squarely in their scientific research.

We might ask: Why is there so much eagerness to take up ideas that are ultimately found to rest on weak empirical foundations, at times being wholly discredited by better research? The Quick Fix has an answer, and it is in the title: We would like to believe that complex and serious social problems can be fixed with simple, rapid, easy psychological interventions, that society can be changed quickly and painlessly by changing the way we think. Singal's critique of this approach is, in a sense, from the (political) left. He finds it not just implausible but insulting to insinuate that poor children in neglected schools need more “grit” rather than more money. He has a similar reaction to the idea that centuries of explicit, frank, open racism seen in laws and policies could be addressed in any significant way by training people to reduce their implicit biases, to press a keyboard key a second quicker when seeing a Black face. In essence, Singal finds sociology more powerful than psychology.
In a book that seems made for the current cultural moment in the United States, he attributes inequality in society to systemic, structural factors, not mental ones.

Singal's fairly open political stance has a clear advantage: He can show that he is in political alignment with the researchers he critiques, while questioning their specific proposed methods for addressing the social problems that require attention. No one who reads The Quick Fix can doubt Singal's concern for racial and gender equity or his sympathy for veterans with PTSD. Implicitly (no pun intended), he uses his politics to explain why it matters so much that solutions to these problems be based on valid research. This is a helpful way of countering a common reaction to criticism, that (for instance) those who would dare criticize implicit bias trainings don't care about racism. Indeed, it may be that the politics of academic psychology, which skew left (Gross & Simmons, 2014), prevent more skepticism of research that is lauded just because it addresses issues related to inequality, even if the research is of questionable validity.

Of course, despite his protests to the contrary, it is still possible that Singal's own critics will worry that he is giving ammunition to conservatives or that he is one in disguise. Perhaps it was these concerns that led Singal to include an unusual chapter in his book on “superpredators.” Rising crime rates in the 1990s fueled a panic over the projected emergence of a large cohort of sociopathic young men who were set to go on wanton killing sprees. The prophesied generation of superpredators never came (instead, violent crime rates fell significantly), and in retrospect, the research that the prophecy was based on had significant errors.
But the superpredator idea was unlike the other ideas profiled in The Quick Fix—namely, to find an example of bad social science on the right, Singal had to look outside psychology and to an idea whose originators came to disavow it entirely, leaving it to die a quick death.

One of the dangers of criticizing so many academic research programs is that at some point, they begin to feel less like a few bad apples and more like a rotten tree. Indeed, when the replication crisis first hit, the reaction of many critics was to say that whole subfields of psychology were built on sand. (I remember one prominent professor wondering whether he should give refunds to the students who had taken his social psychology courses.) For much of The Quick Fix, this issue lurks in the background. Even when more recent research begins to question earlier findings, the reader might ask, “How well is the more recent research done? Will it replicate any better?”

In a chapter titled “Non-Replicable,” Singal accessibly describes some of the solutions that methodological reformers have advocated. For instance, researchers can publicly post their “preregistered” hypotheses and planned statistical analyses, so that once the data are in, the researchers can't capitalize on chance and analyze the data in different ways until significant results appear. Relatedly, journals can offer peer review of study designs and statistical analysis plans and (if appropriate) commit to accepting the article once the data are in; this will keep articles from getting rejected just because the results aren't statistically significant or otherwise go against reviewers’ expectations. Even lowering the required p value for publication would do some work toward controlling the rate of false positive findings in the literature.
However, it's not always clear which (if any) of these strategies, or the others that are reviewed, were used in the newer research that debunked the targets of Singal's critique.

In some ways, the real “quick fix” may be our desire for quick knowledge via summaries and second-hand reports. Singal shows how popularized versions of research often distort findings and don't convey any of the limitations of the studies being described. In the age of social media, this problem may have worsened, as headlines and soundbites can be shared easily by millions of people who never even read the brief second-hand reports. These trends, along with the incentives that researchers have to overstate findings, combine to make misinformation common among people who believe themselves to be “following the research.” There are really no shortcuts to accurate information, no alternatives to carefully and critically reviewing the primary source literature. Unfortunately, this is something that few people have the time and training to do. We typically defer to peer reviewers and editors, meta-analyses, and expert panels, but these solutions are all far from perfect.

An alternative approach would be to change the culture of science so that the original findings are more trustworthy and so that scientists are more careful in drawing conclusions and generalizations from those findings. There are signs of a new culture developing, at least among pockets in the psychology community. Singal notes one such sign in the conclusion to The Quick Fix: As the COVID-19 pandemic took shape, some psychologists rushed to offer advice to the world based on behavioral science findings, but one group of psychologists wrote an article (IJzerman et al., 2020) urging caution and expressing skepticism that behavioral science was “mature” enough to provide much useful guidance.
More generally, the statistical and methodological reforms mentioned earlier, and the “open science” movement in psychology as a whole (e.g., Spellman, Gilbert, & Corker, 2018), have the potential to increase the trust that laypeople can have in psychologists and that we psychologists can have in each other. Much remains to be worked out about how best to improve psychology research, but the widespread interest in doing so is itself a positive development. Singal is aware of all this, and he ends the book on a positive note.

The Quick Fix is an excellent book. Singal writes clearly and engagingly, and he shows deep understanding of many technical issues, an impressive feat for a journalist. There are occasions when his accessible way of presenting research results might trouble pedantic methodologists, but I would not fault him for this: Given that the book is directed at the intelligent layperson, I believe that he struck the right balance between technical precision and readability, and I would not hesitate to recommend the book to psychology students and researchers either. I found only one clear error: on page 280, Singal cites a book coauthored by “the late, great social psychologist Elliot Abramson” that was in fact coauthored by the still-alive Elliot Aronson. I was impressed that I did not find more errors, especially given that at times I was already familiar with the works Singal was discussing. All that said, I must acknowledge that The Quick Fix takes a certain side in the debates that the book covers, and my own areas of research expertise are not represented in those debates.

Now it will be interesting to see what effect the book has. Much of the developing conversation about the replication crisis and open science in psychology has taken place over the past decade on social media—blogs, Twitter feeds, and the like—so the reaction may be visible to the public. Will The Quick Fix lead to more skepticism toward the specific research programs discussed therein?
Will it decrease trust in, or reliance on, psychology studies more generally? And will it lead to replies from researchers, perhaps containing effective rejoinders? The conversation will continue, one way or another, and Singal has provided a valuable contribution to that conversation.